Shannon entropy: a rigorous notion at the crossroads between probability, information theory, dynamical systems and statistical physics
Abstract
Statistical entropy was introduced by Shannon as a basic concept in information theory, measuring the average missing information on a random source. Extended to an entropy rate, it gives bounds in coding and compression theorems. Here I present how statistical entropy and entropy rate relate to other notions of entropy, relevant either to probability theory (entropy of a discrete probability...
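As a concrete illustration of the quantity the abstract describes (a minimal sketch, not code from the article), the Shannon entropy of a discrete distribution p is H(p) = -Σ_i p_i log2 p_i, measured in bits:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits.

    Terms with p_i == 0 contribute nothing, by the usual
    0 * log 0 = 0 convention.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin leaves 1 bit of missing information per draw;
# a uniform choice among four outcomes leaves 2 bits.
print(shannon_entropy([0.5, 0.5]))                # 1.0
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
```

The entropy rate mentioned in the abstract is the per-symbol limit of such entropies computed over longer and longer blocks of the source's output.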
Similar articles

Entropy of infinite systems and transformations
The Kolmogorov-Sinai entropy is a far reaching dynamical generalization of Shannon entropy of information systems. This entropy works perfectly for probability measure preserving (p.m.p.) transformations. However, it is not useful when there is no finite invariant measure. There are certain successful extensions of the notion of entropy to infinite measure spaces, or transformations with ...
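For a flavour of how a dynamical entropy reduces to Shannon's entropy rate, here is a minimal sketch (my illustration, not taken from the snippet above): for an i.i.d. fair-bit source, i.e. a Bernoulli(1/2, 1/2) shift whose Kolmogorov-Sinai entropy is ln 2 nats, the per-symbol block entropies H_n / n approach that value:

```python
import math
import random
from collections import Counter

def block_entropy(seq, n):
    """Shannon entropy (in nats) of the empirical distribution of
    length-n blocks of seq."""
    blocks = Counter(tuple(seq[i:i + n]) for i in range(len(seq) - n + 1))
    total = sum(blocks.values())
    return -sum((c / total) * math.log(c / total) for c in blocks.values())

random.seed(0)
seq = [random.randint(0, 1) for _ in range(200_000)]

# H_n / n should settle near ln 2 ~= 0.6931 for a fair-bit source.
for n in (1, 2, 4, 8):
    print(n, block_entropy(seq, n) / n)
```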
Some properties of the parametric relative operator entropy
The notion of entropy was introduced by Clausius in 1850, and some of the main steps towards the consolidation of the concept were taken by Boltzmann and Gibbs. Since then several extensions and reformulations have been developed in various disciplines with motivations and applications in different subjects, such as statistical mechanics, information theory, and dynamical systems. Fujii and Kam...
Dynamics of Uncertainty in Nonequilibrium Random Motion
Shannon information entropy is a natural measure of probability (de)localization and thus (un)predictability in various procedures of data analysis for model systems. We pay particular attention to links between the Shannon entropy and the related Fisher information notion, which jointly account for the shape and extension of continuous probability distributions. Classical, dynamical and random...
Combinatorial entropies and statistics
We examine the combinatorial or probabilistic definition ("Boltzmann's principle") of the entropy or cross-entropy function H ∝ ln W or D ∝ −ln P, where W is the statistical weight and P the probability of a given realization of a system. Extremisation of H or D, subject to any constraints, thus selects the "most probable" (MaxProb) realization. If the system is multinomial, D converges asymptot...
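The asymptotic convergence claimed above can be checked numerically (a sketch with my own choice of counts, not the paper's data): by Stirling's formula, (1/N) ln W for the multinomial weight W = N! / Π n_i! approaches the Shannon entropy -Σ f_i ln f_i of the frequencies f_i = n_i / N as N grows:

```python
import math

def log_multinomial_weight(counts):
    """ln W for the multinomial weight W = N! / prod(n_i!),
    computed with lgamma to avoid overflow."""
    n = sum(counts)
    return math.lgamma(n + 1) - sum(math.lgamma(c + 1) for c in counts)

def entropy_of_frequencies(counts):
    """-sum_i f_i ln f_i with f_i = n_i / N (natural log)."""
    n = sum(counts)
    return -sum((c / n) * math.log(c / n) for c in counts if c > 0)

# As N grows at fixed frequencies (0.2, 0.3, 0.5), (1/N) ln W
# approaches the Shannon entropy of those frequencies.
for scale in (10, 100, 10_000):
    counts = [2 * scale, 3 * scale, 5 * scale]
    n = sum(counts)
    print(n, log_multinomial_weight(counts) / n, entropy_of_frequencies(counts))
```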
Journal: Mathematical Structures in Computer Science
Volume: 24
Pages: -
Publication year: 2014